Alternating Direction Method of Multipliers for Generalized Low-Rank Tensor Recovery
Authors
Abstract
Low-Rank Tensor Recovery (LRTR), the higher-order generalization of Low-Rank Matrix Recovery (LRMR), is especially well suited to analyzing multi-linear data with gross corruptions, outliers, and missing values, and it has attracted broad attention in computer vision, machine learning, and data mining. This paper considers a generalized LRTR model and aims to recover the low-rank, sparse, and small-disturbance components simultaneously from partial entries of a given data tensor. Specifically, we first formulate generalized LRTR as a tensor nuclear norm optimization problem that minimizes a weighted combination of the tensor nuclear norm, the l1-norm, and the Frobenius norm under linear constraints. The Alternating Direction Method of Multipliers (ADMM) is then employed to solve the proposed minimization problem, and we discuss the weak convergence of the resulting iterative algorithm. Finally, experimental results on synthetic and real-world datasets validate the efficiency and effectiveness of the proposed method.
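The abstract does not state the optimization problem explicitly; the formulation below is a hedged reconstruction of the model it describes (the component names L, S, E, the sampling set Omega, and the weights lambda_1, lambda_2 are our notation, not the paper's):

\min_{\mathcal{L},\,\mathcal{S},\,\mathcal{E}} \;\; \|\mathcal{L}\|_{*} + \lambda_{1}\,\|\mathcal{S}\|_{1} + \frac{\lambda_{2}}{2}\,\|\mathcal{E}\|_{F}^{2}
\qquad \text{s.t.} \qquad P_{\Omega}(\mathcal{L} + \mathcal{S} + \mathcal{E}) = P_{\Omega}(\mathcal{T}),

where \mathcal{T} is the partially observed data tensor, \|\cdot\|_{*} is a tensor nuclear norm (for example, a weighted sum of the nuclear norms of the mode-n unfoldings), \|\cdot\|_{1} is the entrywise l1-norm, and \|\cdot\|_{F} is the Frobenius norm. Under such a splitting, each ADMM block typically has a closed-form update: singular value thresholding for the low-rank block, entrywise soft-thresholding for the sparse block, a scaled averaging step for the small-disturbance block, and gradient ascent on the multipliers. The two proximal operators such updates rely on are sketched below in NumPy; this is an illustrative fragment, not the paper's implementation.

import numpy as np

def soft_threshold(X, tau):
    # prox of tau * ||.||_1: entrywise shrinkage, used for the sparse component
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(M, tau):
    # prox of tau * ||.||_* on a matrix (e.g., a mode-n unfolding of the
    # low-rank component): shrink the singular values and rebuild the matrix
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt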
Similar resources
Tensor completion and low-n-rank tensor recovery via convex optimization
In this paper we consider sparsity on a tensor level, as given by the n-rank of a tensor. In the important sparse-vector approximation problem (compressed sensing) and the low-rank matrix recovery problem, using a convex relaxation technique proved to be a valuable solution strategy. Here, we will adapt these techniques to the tensor setting. We use the n-rank of a tensor as sparsity measure an...
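For reference, the n-rank is the tuple of ranks of the mode-n unfoldings, and the convex surrogate adopted in this line of work replaces it with a weighted sum of matrix nuclear norms, giving recovery problems of the form (the weights \gamma_{n} and the operator \mathcal{A} are generic placeholders, not taken from the snippet):

\min_{\mathcal{X}} \;\; \sum_{n=1}^{N} \gamma_{n}\,\|\mathbf{X}_{(n)}\|_{*}
\qquad \text{s.t.} \qquad \mathcal{A}(\mathcal{X}) = \mathbf{b},

where \mathbf{X}_{(n)} denotes the mode-n unfolding of \mathcal{X} and \mathcal{A} is the linear measurement (or sampling) operator.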
Exact Tensor Completion from Sparsely Corrupted Observations via Convex Optimization
This paper conducts a rigorous analysis for provable estimation of multidimensional arrays, in particular third-order tensors, from a random subset of its corrupted entries. Our study rests heavily on a recently proposed tensor algebraic framework in which we can obtain tensor singular value decomposition (t-SVD) that is similar to the SVD for matrices, and define a new notion of tensor rank re...
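The snippet breaks off before the rank definition; in the t-SVD framework it refers to, the tensor nuclear norm of a third-order tensor is computed from the singular values of its frontal slices after a discrete Fourier transform along the third mode (normalization conventions differ between papers). A robust completion model of the kind described would then read, schematically:

\min_{\mathcal{L},\,\mathcal{S}} \;\; \|\mathcal{L}\|_{\mathrm{TNN}} + \lambda\,\|\mathcal{S}\|_{1}
\qquad \text{s.t.} \qquad P_{\Omega}(\mathcal{L} + \mathcal{S}) = P_{\Omega}(\mathcal{M}),

with \mathcal{M} the partially observed, sparsely corrupted tensor; this is the generic form of such models, not a quotation of the cited paper's exact statement.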
A Nuclear Norm Minimization Algorithm with Application
In this paper we present a new algorithm to reconstruct prestack (5D) seismic data. If one considers seismic data at a given frequency and, for instance, in the x midpoint, y midpoint, offset and azimuth domain, the data volume can be represented via a 4th order tensor. Seismic data reconstruction can be posed as a tensor completion problem where it is assumed that the fully sampled data can be...
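To make the tensor viewpoint concrete: at a fixed temporal frequency every trace is indexed by (x-midpoint, y-midpoint, offset, azimuth), so the data fill a 4th-order array and the acquired traces define a sampling mask. A small illustrative NumPy sketch follows; the grid sizes and variable names are hypothetical and not taken from the paper.

import numpy as np

# hypothetical grid sizes for the four spatial coordinates
nx, ny, noff, naz = 32, 32, 16, 12

# one monochromatic slice of the 5D volume: a complex 4th-order tensor
data = np.zeros((nx, ny, noff, naz), dtype=complex)

# Boolean sampling mask: True where a trace was actually acquired;
# completion seeks a low-rank tensor that matches data on the mask
mask = np.random.rand(nx, ny, noff, naz) < 0.3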
Recover the lost Phasor Measurement Unit Data Using Alternating Direction Multipliers Method
This paper presents a novel algorithm for recovering missing data of phasor measurement units (PMUs). Due to the low-rank property of PMU data, missing measurement recovery can be formulated as a low-rank matrix-completion problem. Based on maximum-margin matrix factorization, we propose an efficient algorithm based on alternating direction method of multipliers (ADMM) for solving the matrix co...
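Maximum-margin matrix factorization trades the nuclear-norm term for a bilinear factorization with Frobenius-norm regularization, which keeps each ADMM (or alternating) update cheap. The completion problem then typically has the form

\min_{\mathbf{U},\,\mathbf{V}} \;\; \frac{1}{2}\,\bigl\|P_{\Omega}(\mathbf{M} - \mathbf{U}\mathbf{V}^{\top})\bigr\|_{F}^{2} + \frac{\lambda}{2}\bigl(\|\mathbf{U}\|_{F}^{2} + \|\mathbf{V}\|_{F}^{2}\bigr),

where \mathbf{M} collects the PMU measurements, \Omega marks the entries that were actually received, and the widths of \mathbf{U} and \mathbf{V} cap the rank of the recovered matrix. This is the standard form of the model class named in the snippet, not necessarily the exact objective used by the cited algorithm.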
Low-Rank Tensor Completion by Truncated Nuclear Norm Regularization
Currently, low-rank tensor completion has gained cumulative attention in recovering incomplete visual data whose partial elements are missing. By taking a color image or video as a three-dimensional (3D) tensor, previous studies have suggested several definitions of tensor nuclear norm. However, they have limitations and may not properly approximate the real rank of a tensor. Besides, they do n...
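In the matrix case, the truncated nuclear norm alluded to here penalizes only the smallest singular values, leaving the r largest untouched:

\|\mathbf{X}\|_{r} \;=\; \sum_{i=r+1}^{\min(m,n)} \sigma_{i}(\mathbf{X}),

and tensor versions typically apply this to the mode-n unfoldings. The snippet is cut off before the authors' own definition, so the formula above is the usual matrix definition rather than the paper's.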
Journal: Algorithms
Volume: 9, Issue: -
Pages: -
Year of publication: 2016